Paul Hagstrom



Introduction


Paul Hagstrom is a fine example of the realities of computer-related learning in the 1980s: self-directed, supported with just enough hardware to take the next step, each step feeding a reinforcing cycle of exploration and education. Not everyone who learned computers turned it into a career. Paul is an associate professor of Linguistics at Boston University, a field with little connection to computing, and a co-founder of Drop /// Inches, a podcast about the Apple ///, which shows that the learning persisted.

Paul's Story


  • My first encounter with computers was in fifth grade. The school had a single Apple II Plus, but its only role in the curriculum was a small set of optional projects that taught very simple BASIC programming. I took that option, and liked the process of programming so much that I continued to spend most of my free time on it. I did not at the time have much competition for the machine, to the extent that the school actually loaned it to me over the following summer when school was not in session. Apart from that initial introduction, I was completely self-taught, using the manuals, some magazines of the time, and a few books on programming. A couple of grades later, there were a couple of teachers who were interested in my programming projects and gave me slightly more access to equipment than other students had in order to help me along, and that certainly made a difference as well, as did starting to have more interactions with a few other students who by then also turned out to enjoy programming and computers. I can't pinpoint exactly what drew me to it, but it was probably partly the sense of exploration and the challenge: the ability to figure out new things to do with it, to have ideas about how to improve programs I'd written, and to puzzle through how to implement them. It was also something entirely under my control: if I told it what to do correctly, it would do exactly what I told it, predictably. But for whatever reason, it captured my attention quite thoroughly, and soon after the summer with the borrowed computer, our family got one, which I dominated until I eventually got my own. This stuck with me; I have been quite involved in computing ever since, although it wasn't where my career actually went. These days, I have gone back to become involved in trying to archive what remains of that time in the history of computing, which turns out to have been very important in a much wider context, although for me at the time, it was just what being a kid was about.

  • I was not the only one to have done that optional programming project in fifth grade, but it affected me quite a bit more than it seemed to have affected the others who did. It might just be that I was lucky in having noticed the potential in programming beyond just the content of the little project; there wasn't really much follow-up in place in the curriculum. It was also a pretty different time, and things wouldn't happen the same way now, I don't think. Everyone would be able to relate what they did to the kinds of things they see on their iPads, but the complexity of modern computing presents a higher barrier to entry as well. With the computers of the 1980s, you drove the machine directly, you could know everything about it, and getting it to print text, ask questions, and do computations was something you could do in seconds without having to open windows, include libraries, or compile things (the short sketch after this list gives a sense of it). But to the larger point about kids being able to teach themselves once sufficiently intrigued with a topic, I definitely have myself as anecdotal evidence that it works. I consider myself today to be relatively expert in many areas of contemporary computing, and I was back in the late 1980s as well, but I have only ever taken one structured course (a half semester on parallel programming in C); the rest of what I know has come through reading books, magazines, and web sites, and through experimenting.
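
To give a sense of the immediacy Paul describes, here is a minimal sketch of the sort of Applesoft BASIC program a beginner could type directly into an Apple II Plus, an illustration of the style rather than one of Paul's actual programs:

    10 REM ASK A QUESTION AND REMEMBER THE ANSWER
    20 INPUT "WHAT IS YOUR NAME? "; N$
    30 PRINT "HELLO, "; N$
    40 INPUT "PICK A NUMBER: "; X
    50 REM DO A COMPUTATION AND PRINT THE RESULT
    60 PRINT X; " SQUARED IS "; X * X

Typing these lines at the prompt and then typing RUN is the entire workflow: no windows to open, no libraries to include, nothing to compile.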